Natural Language Generation in Dialogue using Lexicalized and Delexicalized Data

Authors

  • Shikhar Sharma
  • Jing He
  • Kaheer Suleman
  • Hannes Schulz
  • Philip Bachman
Abstract

Natural language generation plays a critical role in spoken dialogue systems. We present a new approach to natural language generation for task-oriented dialogue using recurrent neural networks in an encoder-decoder framework. In contrast to previous work, our model uses both lexicalized and delexicalized components, i.e. slot-value pairs for dialogue acts, with slots aligned to their corresponding values. This allows our model to learn from all available data, including the slot-value pairing, rather than being restricted to delexicalized slots. We show that this helps our model generate more natural sentences with better grammar. We further improve our model's performance by transferring weights learnt from a pretrained sentence auto-encoder. Human evaluation of our best-performing model indicates that it generates sentences which users find more appealing.

1 INTRODUCTION

Traditionally, task-oriented spoken dialogue systems (SDS) rely on template-based, hand-crafted rules for natural language generation (NLG). However, this approach does not scale well to complex domains and datasets. Previous papers have explored alternatives using corpus-based n-gram models (Oh & Rudnicky, 2002), tree-based models (Walker et al., 2007), SVM rerankers (Kondadadi et al., 2013), and reinforcement learning models (Rieser & Lemon, 2010). Recently, models based on recurrent neural networks (RNNs) have been successfully applied to natural language generation tasks such as image captioning (Xu et al., 2015; Karpathy & Li, 2015), video description (Yao et al., 2015; Sharma, 2016), and machine translation (Bahdanau et al., 2015). In the domain of task-oriented SDS, RNN-based models have been used for NLG in both traditional multi-component processing pipelines (Wen et al., 2015a;b) and more recent systems designed for end-to-end training (Wen et al., 2017).
In the context of task-oriented dialogue systems, the NLG task consists of translating one or more dialogue act slot-value pairs, e.g. (INFORM-NAME, Super Ramen) and (INFORM-AREA, near the plaza), into a well-formed sentence (Rajkumar et al., 2009) such as "Super Ramen is located near the plaza". Existing RNN-based models (Wen et al., 2015a) tackle this problem by relying only on the delexicalized part of the act slot-value pairs, i.e. the model considers only the act and slot (e.g. INFORM-NAME) and ignores the lexical values (e.g. Super Ramen). Wen et al. (2015b) propose a model that can use lexicalized values; however, since they do not align slots with their values, the model has no way of knowing which value corresponds to which slot. These methods ignore linguistic relationships within the lexicalized part of a slot-value pair (e.g. between the words "near", "the", and "plaza"), and between the lexicalized part and its surrounding context (e.g. between "located" and "near"). As illustrated in Figure 1, ignoring these relationships often leads to grammatically incorrect sentences. In this paper, we develop an RNN-based approach which considers both lexicalized and delexicalized dialogue act slot-value pairs. Our model outperforms existing approaches measured in both automated (BLEU-4 (Papineni et al., 2002), METEOR (Lavie & Agarwal, 2007), ROUGE (Lin, 2004), ...

∗This work was done while Jing He was at Maluuba (now Microsoft Maluuba).

arXiv:1606.03632v3 [cs.CL] 21 Apr 2017. Workshop track, ICLR 2017.
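To make the lexicalized/delexicalized distinction concrete, the following minimal sketch shows the standard preprocessing step the paragraph above describes: slot values in a reference sentence are replaced by placeholder tokens (delexicalization), and substituted back after generation (relexicalization). The function names and placeholder format are illustrative, not taken from the paper.

```python
def delexicalize(sentence, slot_values):
    """Replace each slot value in the sentence with a <SLOT> placeholder."""
    for slot, value in slot_values.items():
        sentence = sentence.replace(value, f"<{slot}>")
    return sentence


def relexicalize(template, slot_values):
    """Substitute slot values back into a delexicalized template."""
    for slot, value in slot_values.items():
        template = template.replace(f"<{slot}>", value)
    return template


# Dialogue act slot-value pairs from the running example in the text.
acts = {"INFORM-NAME": "Super Ramen", "INFORM-AREA": "near the plaza"}
ref = "Super Ramen is located near the plaza"

template = delexicalize(ref, acts)
print(template)            # <INFORM-NAME> is located <INFORM-AREA>
print(relexicalize(template, acts))  # Super Ramen is located near the plaza
```

A purely delexicalized model sees only the template line, so it cannot learn, for instance, that "located" pairs naturally with "near"; the paper's point is that exposing the lexical values as well recovers such relationships.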


Similar Papers

To Plan or not to Plan? Discourse Planning in Slot-Value Informed Sequence to Sequence Models for Language Generation

Natural language generation for task-oriented dialogue systems aims to effectively realize system dialogue actions. All natural language generators (NLGs) must realize grammatical, natural and appropriate output, but in addition, generators for task-oriented dialogue must faithfully perform a specific dialogue act that conveys specific semantic information, as dictated by the dialogue policy of ...


The Importance of Lexicalized Syntax Models for Natural Language Generation Tasks

The parsing community has long recognized the importance of lexicalized models of syntax. By contrast, these models do not appear to have had an impact on the statistical NLG community. To prove their importance in NLG, we show that a lexicalized model of syntax improves the performance of a statistical text compression system, and show results that suggest it would also improve the performance...


Lexicalized vs. Delexicalized Parsing in Low-Resource Scenarios

We present a systematic analysis of lexicalized vs. delexicalized parsing in low-resource scenarios, and propose a methodology to choose one method over another under certain conditions. We create a set of simulation experiments on 41 languages and apply our findings to 9 low-resource languages. Experimental results show that our methodology chooses the best approach in 8 out of 9 cases.


Delexicalized Word Embeddings for Cross-lingual Dependency Parsing

This paper presents a new approach to the problem of cross-lingual dependency parsing, aiming at leveraging training data from different source languages to learn a parser in a target language. Specifically, this approach first constructs word vector representations that exploit structural (i.e., dependency-based) contexts but only considering the morpho-syntactic information associated with ea...


Parsing strategy for spoken language interfaces with a lexicalized tree grammar

Our work addresses the integration of speech recognition and natural language processing for spoken dialogue systems. To deal with recognition errors, we investigate two repairing strategies integrated in a parsing approach based on a Lexicalized Tree Grammar. The first strategy takes its root in the recognition hypothesis, the other in the linguistic expectations. We expose a formal framework to express ...



Journal:
  • CoRR

Volume: abs/1606.03632

Pages: -

Publication date: 2016